165 research outputs found

    Assessing haptic properties for data representation

    This paper describes the results of a series of forced-choice design experiments investigating the discrimination of material properties using a PHANToM haptic device. Research has shown that the PHANToM is effective at displaying graphical information to blind people, but the techniques used so far have been very simple. Our experiments showed that subjects' discrimination of friction was significantly better than that of stiffness or the spatial period of sinusoidal textures, over the range of stimuli investigated. Thus, it is proposed that graphical data could be made more easily accessible to blind users by scaling the data values to friction rather than to shape or size, as in traditional bar charts.
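
    As a rough illustration of the proposed mapping (not the authors' implementation), the sketch below linearly rescales data values onto a hypothetical friction range that a device such as the PHANToM could render; the endpoint values are illustrative assumptions.

```python
def scale_to_friction(values, mu_min=0.2, mu_max=0.9):
    """Linearly map data values onto a friction-coefficient range.

    mu_min and mu_max are illustrative endpoints, not values from the paper;
    they stand in for the weakest and strongest friction a haptic device
    can reliably render.
    """
    lo, hi = min(values), max(values)
    if hi == lo:  # constant data: render everything at the midpoint
        return [(mu_min + mu_max) / 2 for _ in values]
    span = mu_max - mu_min
    return [mu_min + (v - lo) / (hi - lo) * span for v in values]

# Example: bar-chart values rendered as friction levels instead of bar heights
print(scale_to_friction([3, 7, 12, 5]))
```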

    Tac-tiles: multimodal pie charts for visually impaired users

    Tac-tiles is an accessible interface that allows visually impaired users to browse graphical information using tactile and audio feedback. The system uses a graphics tablet which is augmented with a tangible overlay tile to guide user exploration. Dynamic feedback is provided by a tactile pin-array at the fingertips, and through speech and non-speech audio cues. In designing the system, we seek to preserve the affordances and metaphors of traditional, low-tech teaching media for the blind, and to combine these with the benefits of a digital representation. Traditional tangible media allow rapid, non-sequential access to data, promote easy and unambiguous access to resources such as axes and gridlines, allow the use of external memory, and preserve visual conventions, thus promoting collaboration with sighted colleagues. A prototype system was evaluated with visually impaired users, and recommendations for multimodal design were derived.
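
    A minimal sketch of one piece of such a system, assuming a circular pie chart centred on the tablet: it converts a touch position into the segment under the finger, which is where tactile and audio feedback would be triggered. The geometry and names are illustrative, not taken from the paper.

```python
import math

def segment_under_finger(x, y, cx, cy, segments):
    """Return the label of the pie segment under a touch point.

    segments is a list of (label, fraction) pairs summing to 1.0; the chart
    is assumed centred at (cx, cy) with 12 o'clock as 0 degrees, clockwise.
    """
    angle = math.degrees(math.atan2(x - cx, cy - y)) % 360
    start = 0.0
    for label, fraction in segments:
        end = start + fraction * 360
        if start <= angle < end:
            return label
        start = end
    return segments[-1][0]

# Example: a three-segment chart; the speech cue or pin-array pattern would
# then be chosen from the returned label.
chart = [("Rent", 0.5), ("Food", 0.3), ("Other", 0.2)]
print(segment_under_finger(120, 40, 100, 100, chart))
```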

    Sonically enhanced interface toolkit

    This paper describes an on-going research project investigating the design of a user-interface toolkit composed of sonically enhanced widgets. The motivation for this work is the same as that behind graphical interface toolkits: to simplify construction so that designers who are not experts can create such interfaces; to ensure the sonically enhanced widgets are effective and improve usability; and to ensure the widgets use sound in a clear and consistent way across the interface.
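
    The sketch below illustrates the general idea of wrapping a standard widget so that sound is attached to its events in one consistent place; the hooks, cue names and `play` stub are assumptions, not the toolkit's actual API.

```python
class PlainButton:
    """Stand-in for a toolkit button exposing press/release hooks."""
    def __init__(self):
        self.on_press = None
        self.on_release = None

    def click(self):
        if self.on_press:
            self.on_press()
        if self.on_release:
            self.on_release()


def sonify_button(button, play):
    """Attach consistent audio cues to a button's events.

    `play` stands in for whatever audio back end the toolkit would use;
    the cue names are illustrative assumptions.
    """
    button.on_press = lambda: play("button_down")
    button.on_release = lambda: play("button_up")


# Example: every button in the interface would reuse the same two cues,
# keeping the use of sound consistent across widgets.
btn = PlainButton()
sonify_button(btn, play=lambda cue: print("playing", cue))
btn.click()
```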

    Sound in the interface to a mobile computer

    No abstract available

    Impact of haptic 'touching' technology on cultural applications

    No abstract available

    Tactons: structured tactile messages for non-visual information display

    Tactile displays are now becoming available in a form that can be easily used in a user interface. This paper describes a new form of tactile output. Tactons, or tactile icons, are structured, abstract messages that can be used to communicate information non-visually. A range of different parameters can be used for Tacton construction, including the frequency, amplitude and duration of a tactile pulse, plus other parameters such as rhythm and location. Tactons have the potential to improve interaction in a range of different areas, particularly where the visual display is overloaded, limited in size or not available, such as interfaces for blind people or in mobile and wearable devices. This paper describes Tactons, the parameters used to construct them and some possible ways to design them. Examples of where Tactons might prove useful in user interfaces are given.
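
    A Tacton can be thought of as a small structured record over these parameters. The sketch below shows one plausible encoding; the units, ranges and example values are illustrative assumptions, not figures from the paper.

```python
from dataclasses import dataclass

@dataclass
class Tacton:
    """Structured tactile message built from the parameters named above.

    Units and ranges are illustrative: frequency in Hz, amplitude 0-1,
    rhythm as a list of (on_ms, off_ms) pulses, location as a body site.
    """
    frequency: float
    amplitude: float
    rhythm: list          # e.g. [(100, 50), (100, 400)] = two short pulses
    location: str

# Example: a hypothetical "message received" Tacton delivered at the wrist
message_received = Tacton(frequency=250, amplitude=0.8,
                          rhythm=[(100, 50), (100, 400)], location="wrist")
print(message_received)
```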

    Correcting menu usability problems with sound

    Future human-computer interfaces will use more than just graphical output to display information. In this paper we suggest that sound and graphics together can be used to improve interaction. We describe an experiment to improve the usability of standard graphical menus by the addition of sound. One common difficulty is slipping off a menu item by mistake when trying to select it; one of its causes is insufficient feedback. We designed and experimentally evaluated a new set of menus with much more salient audio feedback to solve this problem. The results from the experiment showed a significant reduction in the subjective effort required to use the new sonically-enhanced menus, along with significantly reduced error-recovery times. A significantly larger number of errors were also corrected with sound.
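
    A minimal sketch of the interaction being studied (not the paper's implementation): detect when the pointer is released over a different item from the one it was pressed on, and play a more salient cue so the slip is noticed. The cue names and `play` stub are assumptions.

```python
def handle_menu_release(pressed_item, released_item, play):
    """Play audio feedback depending on whether the selection 'slipped'.

    `play` is a stand-in for an audio back end; cue names are illustrative.
    """
    if released_item == pressed_item:
        play("item_selected")      # quiet confirmation of a correct selection
        return pressed_item
    play("item_slipped")           # salient cue: the wrong item (or none) was hit
    return None

# Example: the user pressed "Save" but slipped and released over "Save As..."
choice = handle_menu_release("Save", "Save As...",
                             play=lambda cue: print("cue:", cue))
print("selected:", choice)
```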

    Effects of feedback, mobility and index of difficulty on deictic spatial audio target acquisition in the horizontal plane

    We present the results of an empirical study investigating the effect of feedback, mobility and index of difficulty on a deictic spatial audio target acquisition task in the horizontal plane in front of a user. With audio feedback, spatial audio display elements are found to enable usable deictic interaction that can be described using Fitts' law. Feedback does not affect perceived workload or preferred walking speed compared to interaction without feedback. Mobility is found to degrade interaction speed and accuracy by 20%. Participants were able to perform deictic spatial audio target acquisition when mobile while walking at 73% of their preferred walking speed. The proposed feedback design is examined in detail and the effects of variable target widths are quantified. Deictic interaction with a spatial audio display is found to be a feasible solution for future interface designs.
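
    For reference, Fitts' law studies of this kind typically use the Shannon formulation of the index of difficulty, ID = log2(D/W + 1), with movement time modelled as MT = a + b * ID. The sketch below computes both; the coefficients a and b are placeholders, not values fitted in this study.

```python
import math

def index_of_difficulty(distance, width):
    """Shannon formulation of Fitts' index of difficulty, in bits."""
    return math.log2(distance / width + 1)

def predicted_movement_time(distance, width, a=0.2, b=0.1):
    """Fitts' law MT = a + b * ID.

    a (seconds) and b (seconds per bit) are placeholder regression
    coefficients, not the ones fitted in this experiment.
    """
    return a + b * index_of_difficulty(distance, width)

# Example: a target 60 degrees away with a 15-degree angular width
print(index_of_difficulty(60, 15))       # about 2.32 bits
print(predicted_movement_time(60, 15))   # about 0.43 s with placeholder a, b
```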

    An environment for studying the impact of spatialising sonified graphs on data comprehension

    We describe AudioCave, an environment for exploring the impact of spatialising sonified graphs on a set of numerical data comprehension tasks. Its design builds on findings regarding the effectiveness of sonified graphs for numerical data overview and discovery by visually impaired and blind students. We demonstrate its use as a test bed for comparing access to a single sonified numerical datum at a time with concurrent access to multiple sonified numerical data. Results from this experiment show that concurrent access makes the multivariate data comprehension tasks we set easier to tackle. AudioCave also demonstrates how the spatialisation of the sonified graphs provides opportunities for sharing the representation. We present two experiments investigating users solving the set data comprehension tasks collaboratively by sharing the data representation.
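
    A rough sketch of what spatialising sonified graphs can mean in practice: each value is mapped to a pitch, a common sonified-graph convention, and each data series is placed at its own azimuth so that several series can be heard concurrently. The pitch range and azimuths here are illustrative assumptions, not AudioCave's actual parameters.

```python
def sonify_series(series, azimuths, f_min=200.0, f_max=1000.0):
    """Map each data series to a list of (frequency_hz, azimuth_deg) pairs.

    Value -> pitch is a common sonified-graph convention; giving each series
    its own azimuth is what allows concurrent listening. Ranges and azimuth
    choices are illustrative, not taken from the paper.
    """
    all_values = [v for s in series for v in s]
    lo, hi = min(all_values), max(all_values)
    span = (hi - lo) or 1.0
    out = []
    for s, az in zip(series, azimuths):
        out.append([(f_min + (v - lo) / span * (f_max - f_min), az) for v in s])
    return out

# Example: two series, one heard to the left and one to the right
left, right = sonify_series([[1, 3, 2], [4, 2, 5]], azimuths=[-45, 45])
print(left)
print(right)
```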

    Non-visual overviews of complex data sets

    This paper describes the design and preliminary testing of an interface for obtaining overview information from complex numerical data tables non-visually, something that currently available accessibility tools for blind and visually impaired users cannot do. A sonification technique that hides detail in the data and highlights its main features, without performing any computations on the data, is combined with a graphics tablet for focus+context interactive navigation in an interface called TableVis. Results from its evaluation suggest that this technique can deliver better scores than speech in the time taken to answer overview questions, the correctness of the answers and subjective workload.
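
    One plausible reading of the approach (an assumption about the specifics, not the paper's implementation): each cell's raw value is mapped directly to a pitch, and sweeping over a row on the tablet plays that row's cells as a rapid burst, so the row's overall shape is heard without computing any summary statistics. The pitch range is illustrative.

```python
def row_to_pitches(row, lo, hi, f_min=200.0, f_max=1000.0):
    """Map the raw cell values of one row directly to pitches (Hz).

    No aggregation is performed: every value keeps its own tone, and playing
    them as a rapid burst conveys the row's shape. Ranges are illustrative.
    """
    span = (hi - lo) or 1.0
    return [f_min + (v - lo) / span * (f_max - f_min) for v in row]

# Example table; lo/hi are taken over the whole table so rows are comparable
table = [[3, 5, 9], [2, 2, 4], [8, 7, 6]]
flat = [v for row in table for v in row]
for row in table:
    print(row_to_pitches(row, min(flat), max(flat)))
```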